Critique of q-entropy for thermal statistics

Authors

Abstract


Similar Articles

A critique of non-extensive q-entropy for thermal statistics

Recently there have been numerous articles on a new relation between entropy and probability which is non-extensive, and which has an undetermined parameter q that depends on the nature of the thermodynamic system under consideration. For q = 1 this relation corresponds to the Boltzmann-Gibbs entropy, but for other values of q it is claimed that this relation leads to a formalism which is also ...
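The abstract does not reproduce the entropy relation itself; as a point of reference only (the Tsallis form is assumed here, since that is what this q-entropy literature normally means), for a discrete distribution {p_i} it reads

\[
S_q = k\,\frac{1 - \sum_i p_i^{\,q}}{q - 1},
\qquad
\lim_{q \to 1} S_q = -k \sum_i p_i \ln p_i ,
\]

so that q = 1 recovers the Boltzmann-Gibbs entropy. The non-extensivity shows up for two independent subsystems A and B, for which S_q(A+B) = S_q(A) + S_q(B) + (1 - q)\,S_q(A)\,S_q(B)/k.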

Full text

Shannon entropy in generalized order statistics from Pareto-type distributions

In this paper, we derive the exact analytical expressions for the Shannon entropy of generalized order statistics from Pareto-type and related distributions.
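As an illustration of the kind of closed-form expression involved (a standard textbook case, not the paper's own result): for a plain Pareto density f(x) = \alpha x_m^{\alpha} x^{-(\alpha+1)}, x \ge x_m, the differential Shannon entropy is

\[
H(X) = -\int_{x_m}^{\infty} f(x)\,\ln f(x)\,dx = \ln\!\frac{x_m}{\alpha} + \frac{1}{\alpha} + 1 .
\]

The paper derives exact expressions of this type for generalized order statistics from Pareto-type laws.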

Full text

Entropy Properties of Certain Record Statistics and Some Characterization Results

In this paper, the largest and the smallest observations are considered at the time when a new record of either kind (upper or lower) occurs, based on a sequence of independent random variables with a common continuous distribution. We prove that the sequences of residual or past entropy of the current records characterize F within the family of continuous distributions. The exponential and the ...
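For orientation, the residual and past entropies referred to above are the standard ones (the definitions are not quoted from the paper): for a random variable X with density f, distribution function F and survival function \bar F,

\[
H(X; t) = -\int_{t}^{\infty} \frac{f(x)}{\bar F(t)} \ln\!\frac{f(x)}{\bar F(t)}\,dx ,
\qquad
\bar H(X; t) = -\int_{0}^{t} \frac{f(x)}{F(t)} \ln\!\frac{f(x)}{F(t)}\,dx .
\]

The characterization results concern when the sequence of these quantities, evaluated at the current record values, determines F uniquely.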

Full text

Frequentist and Bayesian Statistics: a Critique

There are two broad approaches to formal statistical inference, taken here as concerned with the development of methods for analysing noisy empirical data and, in particular, with attaching measures of uncertainty to conclusions. The object of this paper is to summarize what is involved. The issue is this. We have data represented collectively by y and taken to be the observed value of a vector ra...
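As a hedged sketch of the contrast being set up (not taken from the paper): with the data y modelled as a draw from f(y \mid \theta), the frequentist treats \theta as a fixed unknown and calibrates procedures by their sampling behaviour under f(y \mid \theta), while the Bayesian places a prior \pi(\theta) on it and refers all uncertainty to the posterior

\[
\pi(\theta \mid y) = \frac{f(y \mid \theta)\,\pi(\theta)}{\int f(y \mid \theta')\,\pi(\theta')\,d\theta'} .
\]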

Full text

Relative Entropy and Statistics

My greatest concern was what to call it. I thought of calling it “information”, but the word was overly used, so I decided to call it “uncertainty”. When I discussed it with John von Neumann, he had a better idea. Von Neumann told me, “You should call it entropy, for two reasons. In the first place your uncertainty function has been used in statistical mechanics under that name, so it already h...

Full text


Journal

Journal title: Physical Review E

Year: 2003

ISSN: 1063-651X, 1095-3787

DOI: 10.1103/physreve.67.036114